PAutomaC: a probabilistic automata and hidden Markov models learning competition
Similar resources
Links between probabilistic automata and hidden Markov models: probability distributions, learning models and induction algorithms
This article presents an overview of Probabilistic Automata (PA) and discrete Hidden Markov Models (HMMs), and aims at clarifying the links between them. The first part of this work concentrates on probability distributions generated by these models. Necessary and sufficient conditions for an automaton to define a probabilistic language are detailed. It is proved that probabilistic deterministi...
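To make concrete how a probabilistic automaton generates a distribution over strings, here is a minimal sketch (not taken from the article; the two-state automaton, its weights, and the function name string_probability are invented for illustration). It computes the probability of a string as a chain of matrix products over per-symbol transition matrices, followed by a stopping-probability vector:

```python
import numpy as np

# Illustrative sketch, not from the article: a probabilistic finite automaton
# assigns a probability to each string as the total weight of all paths that
# read the string and then stop. With initial vector `init`, per-symbol
# transition matrices `trans[a]`, and stopping-probability vector `final`,
# the probability is a chain of matrix-vector products.

def string_probability(init, trans, final, string):
    """P(string) = init^T * T[a1] * ... * T[an] * final."""
    alpha = init.copy()
    for symbol in string:
        alpha = alpha @ trans[symbol]
    return float(alpha @ final)

# Two-state example over the alphabet {0, 1}. For each state, the outgoing
# transition mass plus the stopping probability sums to one.
init = np.array([1.0, 0.0])
trans = {
    0: np.array([[0.3, 0.2],
                 [0.1, 0.4]]),
    1: np.array([[0.2, 0.1],
                 [0.1, 0.2]]),
}
final = np.array([0.2, 0.2])

print(string_probability(init, trans, final, [0, 1]))
```

In this toy automaton every state halts with positive probability at each step, so the machine terminates with probability one and the values assigned to all finite strings form a proper distribution; conditions of this kind are what the article characterizes in general.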
Results of the PAutomaC Probabilistic Automaton Learning Competition
Approximating distributions over strings is a hard learning problem. Typical grammatical inference (GI) techniques involve using finite state machines as models and attempting to learn both the structure and the weights simultaneously. The PAutomaC competition is the first challenge to allow comparison between methods and algorithms and establishes a first state of the art for these techniques. Both artificial data and re...
PAutomaC: a PFA/HMM Learning Competition
Approximating distributions over strings is a hard learning problem. Typical techniques involve using finite state machines as models and attempting to learn them: these machines can either be hand-built and then have their weights estimated, or constructed by grammatical inference techniques, in which case the structure and the weights are learned simultaneously. The PAutomaC competition, run in 2012, was t...
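As an illustration of the "hand-built structure, estimated weights" setting mentioned in this abstract, the following sketch (a hypothetical example, not the competition's reference method, data format, or any baseline; delta, estimate_weights, and the toy automaton are invented) fixes a deterministic transition structure and estimates its weights by counting transition and stopping events in a sample, with add-one smoothing:

```python
from collections import defaultdict

# Hypothetical sketch: weight estimation for a fixed, hand-built deterministic
# automaton structure. The structure `delta` maps (state, symbol) to the next
# state; weights are relative frequencies of transitions and stopping events,
# smoothed so unseen events keep nonzero probability.

def estimate_weights(delta, n_states, alphabet, sample, smoothing=1.0):
    """Return per-state probabilities over symbols plus a stopping event (None)."""
    counts = defaultdict(float)          # (state, symbol-or-None) -> count
    for string in sample:
        state = 0                        # assume state 0 is the initial state
        for symbol in string:
            counts[(state, symbol)] += 1.0
            state = delta[(state, symbol)]
        counts[(state, None)] += 1.0     # stopping event in the final state
    weights = {}
    events = list(alphabet) + [None]
    for state in range(n_states):
        total = sum(counts[(state, e)] + smoothing for e in events)
        for e in events:
            weights[(state, e)] = (counts[(state, e)] + smoothing) / total
    return weights

# Toy structure: two states, 'a' toggles between them, 'b' stays put.
delta = {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 0, (1, 'b'): 1}
sample = [['a', 'b'], ['a', 'a'], ['b'], []]
print(estimate_weights(delta, 2, ['a', 'b'], sample))
```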
Learning Hidden Quantum Markov Models
Hidden Quantum Markov Models (HQMMs) can be thought of as quantum probabilistic graphical models that can model sequential data. We extend previous work on HQMMs with three contributions: (1) we show how classical hidden Markov models (HMMs) can be simulated on a quantum circuit, (2) we reformulate HQMMs by relaxing the constraints for modeling HMMs on quantum circuits, and (3) we present a lea...
Learning Imprecise Hidden Markov Models
Consider a stationary precise hidden Markov model (HMM) with n hidden states X_k, taking values x_k in a set {1, ..., m}, and n observations O_k, taking values o_k. The marginal model p_{X_1}(x_1), the emission models p_{O_k|X_k}(o_k|x_k), and the transition models p_{X_k|X_{k-1}}(x_k|x_{k-1}) are all unknown. We can then use the Baum–Welch algorithm [see, e.g., 4] to get a maximum-likelihood estimate of these models. T...
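The estimation step this abstract starts from, fitting p_{X_1}, p_{O_k|X_k} and p_{X_k|X_{k-1}} by maximum likelihood with the Baum–Welch algorithm, can be sketched as follows (an illustrative single-sequence NumPy implementation, not code from the paper; the model size, symbol alphabet, and iteration count are arbitrary choices):

```python
import numpy as np

# Illustrative Baum-Welch sketch for one observation sequence: re-estimates the
# initial distribution pi, transition matrix A, and emission matrix B of a
# stationary discrete HMM by expectation-maximization.

def baum_welch(obs, m, n_symbols, n_iter=50, rng=np.random.default_rng(0)):
    pi = np.full(m, 1.0 / m)
    A = rng.random((m, m)); A /= A.sum(axis=1, keepdims=True)
    B = rng.random((m, n_symbols)); B /= B.sum(axis=1, keepdims=True)
    n = len(obs)
    for _ in range(n_iter):
        # Forward pass with per-step normalization for numerical stability.
        alpha = np.zeros((n, m)); scale = np.zeros(n)
        alpha[0] = pi * B[:, obs[0]]
        scale[0] = alpha[0].sum(); alpha[0] /= scale[0]
        for k in range(1, n):
            alpha[k] = (alpha[k - 1] @ A) * B[:, obs[k]]
            scale[k] = alpha[k].sum(); alpha[k] /= scale[k]
        # Backward pass using the same scaling factors.
        beta = np.zeros((n, m)); beta[-1] = 1.0
        for k in range(n - 2, -1, -1):
            beta[k] = (A @ (B[:, obs[k + 1]] * beta[k + 1])) / scale[k + 1]
        # E-step: state posteriors (gamma) and expected transition counts (xi).
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = np.zeros((m, m))
        for k in range(n - 1):
            x = alpha[k][:, None] * A * (B[:, obs[k + 1]] * beta[k + 1])[None, :]
            xi += x / x.sum()
        # M-step: re-estimate pi, A, B from the posteriors.
        pi = gamma[0]
        A = xi / gamma[:-1].sum(axis=0)[:, None]
        B = np.zeros_like(B)
        for s in range(n_symbols):
            B[:, s] = gamma[obs == s].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]
    return pi, A, B

obs = np.array([0, 1, 1, 0, 2, 2, 1, 0, 0, 2])
pi, A, B = baum_welch(obs, m=2, n_symbols=3)
print(pi, A, B, sep="\n")
```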
Journal
Journal title: Machine Learning
Year: 2013
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-013-5409-9